L1-norm Principal-Component Analysis in L2-norm-reduced-rank Data Subspaces

Authors

  • Panos P. Markopoulos
  • Dimitris A. Pados
  • George N. Karystinos
  • Michael Langberg
Abstract

Standard Principal-Component Analysis (PCA) is known to be very sensitive to outliers among the processed data. On the other hand, it has recently been shown that L1-norm-based PCA (L1-PCA) exhibits sturdy resistance against outliers, while it performs similarly to standard PCA when applied to nominal or smoothly corrupted data. Exact calculation of the K L1-norm Principal Components (L1-PCs) of a rank-r data matrix X ∈ R^{D×N} costs O(2^{NK}) in the general case, and O(N^{(r−1)K+1}) when r is fixed with respect to N. In this work, we examine approximating the K L1-PCs of X by the K L1-PCs of its L2-norm-based rank-d approximation (K ≤ d ≤ r), which are calculable exactly with reduced complexity O(N^{(d−1)K+1}). Reduced-rank L1-PCA aims at leveraging both the low computational cost of standard PCA and the outlier resistance of L1-PCA. Our approximation guarantees and experiments on dimensionality reduction show that, for appropriately chosen d, reduced-rank L1-PCA performs almost identically to L1-PCA.
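As a concrete illustration of the scheme, the following minimal Python sketch (function and variable names are our own, not from the paper) computes a single (K = 1) L1-PC of the rank-d SVD truncation of X. It relies on the known equivalence max_{||w||_2 = 1} ||X^T w||_1 = max_{b ∈ {±1}^N} ||Xb||_2, so the exhaustive search over sign vectors is exponential in N and intended only for small toy inputs.

    import numpy as np
    from itertools import product

    def reduced_rank_l1_pc(X, d):
        # L2-norm rank-d approximation via truncated SVD
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        Xd = (U[:, :d] * s[:d]) @ Vt[:d, :]
        # K = 1 exact L1-PC: maximize ||Xd @ b||_2 over b in {-1, +1}^N
        N = Xd.shape[1]
        best_val, best_b = -np.inf, None
        for signs in product((-1.0, 1.0), repeat=N):
            b = np.array(signs)
            val = np.linalg.norm(Xd @ b)
            if val > best_val:
                best_val, best_b = val, b
        w = Xd @ best_b
        return w / np.linalg.norm(w)

    # toy usage: 10 samples of dimension 5, reduced to a rank-2 subspace
    rng = np.random.default_rng(0)
    X = rng.standard_normal((5, 10))
    print(reduced_rank_l1_pc(X, d=2))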


Related Articles

Face Recognition with L1-norm Subspaces

We consider the problem of representing individual faces by maximum L1-norm projection subspaces calculated from available face-image ensembles. In contrast to conventional L2-norm subspaces, L1-norm subspaces are seen to offer significant robustness to image variations, disturbances, and rank selection. Face recognition then becomes the problem of associating a new unknown face image to the “c...
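A hedged sketch of the recognition step outlined above (the decision rule and names are assumptions filled in by us, since the abstract is truncated): given an orthonormal L1-subspace basis per person, a query face is assigned to the class whose subspace retains the most projection energy.

    import numpy as np

    def nearest_subspace_label(x, bases):
        # bases: dict mapping class label -> orthonormal D x K basis matrix
        # pick the class whose subspace captures the most energy of x
        scores = {c: np.linalg.norm(Q.T @ x) for c, Q in bases.items()}
        return max(scores, key=scores.get)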


A pure L1-norm principal component analysis

The L1 norm has been applied in numerous variations of principal component analysis (PCA). L1-norm PCA is an attractive alternative to traditional L2-based PCA because it can impart robustness in the presence of outliers and is indicated for models where standard Gaussian assumptions about the noise may not apply. Of all the previously-proposed PCA schemes that recast PCA as an optimization pro...
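A pure L1 formulation lends itself to linear programming. As an illustrative sketch only (a single fixed response coordinate, not the paper's full procedure), the following fits a hyperplane minimizing the sum of absolute residuals with scipy.optimize.linprog:

    import numpy as np
    from scipy.optimize import linprog

    def l1_hyperplane_fit(X):
        # fit z ~= A @ beta minimizing sum |z - A @ beta|, with z the
        # last coordinate; auxiliary t >= 0 enforces |z - A beta| <= t
        A, z = X[:, :-1], X[:, -1]
        n, p = A.shape
        c = np.concatenate([np.zeros(p), np.ones(n)])   # minimize sum(t)
        A_ub = np.block([[A, -np.eye(n)], [-A, -np.eye(n)]])
        b_ub = np.concatenate([z, -z])
        bounds = [(None, None)] * p + [(0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        return res.x[:p]

    rng = np.random.default_rng(1)
    X = rng.standard_normal((30, 3))
    print(l1_hyperplane_fit(X))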


Efficient l1-Norm-Based Low-Rank Matrix Approximations for Large-Scale Problems Using Alternating Rectified Gradient Method

Low-rank matrix approximation plays an important role in the areas of computer vision and image processing. Most conventional low-rank matrix approximation methods are based on the l2-norm (Frobenius norm), with principal component analysis (PCA) being the most popular among them. However, this can give a poor approximation for data contaminated by outliers (including missing data), becau...
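To sketch the underlying objective (this is a generic iteratively-reweighted-least-squares alternation, not the paper's alternating rectified gradient method), one can minimize ||X − UV||_1 by alternating weighted least-squares updates:

    import numpy as np

    def l1_lowrank_irls(X, d, iters=30, eps=1e-6):
        # minimize ||X - U @ V||_1 by IRLS: weight each entry by
        # 1/|residual| and alternate weighted least-squares updates
        rng = np.random.default_rng(0)
        D, N = X.shape
        U = rng.standard_normal((D, d))
        V = rng.standard_normal((d, N))
        for _ in range(iters):
            W = np.sqrt(1.0 / np.maximum(np.abs(X - U @ V), eps))
            for j in range(N):   # update columns of V
                V[:, j] = np.linalg.lstsq(W[:, j:j+1] * U,
                                          W[:, j] * X[:, j], rcond=None)[0]
            W = np.sqrt(1.0 / np.maximum(np.abs(X - U @ V), eps))
            for i in range(D):   # update rows of U
                U[i, :] = np.linalg.lstsq(W[i, :, None] * V.T,
                                          W[i, :] * X[i, :], rcond=None)[0]
        return U, V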


L1-norm-based (2D)²PCA

Traditional bidirectional two-dimensional (2D) principal component analysis ((2D)²PCA-L2) is sensitive to outliers because its objective function is the least-squares criterion based on the L2-norm. This paper proposes a simple but effective L1-norm-based bidirectional 2D principal component analysis ((2D)²PCA-L1), which jointly takes advantage of the merits of bidirectional 2D subspace learning and L1...
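For reference, a minimal sketch of the bidirectional (L2) projection that the L1 variant makes robust (helper names are our own; the L1 criterion itself is omitted): the left and right projection matrices are the top eigenvectors of the row- and column-direction scatter matrices.

    import numpy as np

    def bidirectional_2dpca(images, kr, kc):
        # baseline (2D)^2 PCA with the usual L2 criterion
        A = np.asarray(images, dtype=float)        # shape (n, h, w)
        C = A - A.mean(axis=0)
        Gcol = np.einsum('nij,nik->jk', C, C)      # sum Ai^T Ai (w x w)
        Grow = np.einsum('nij,nkj->ik', C, C)      # sum Ai Ai^T (h x h)
        V = np.linalg.eigh(Gcol)[1][:, ::-1][:, :kc]   # right projector
        U = np.linalg.eigh(Grow)[1][:, ::-1][:, :kr]   # left projector
        return U, V

    # project an image A_i: feature = U.T @ (A_i - mean) @ V  (kr x kc)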


Optimal Algorithms for L1-subspace Signal Processing

We describe ways to define and calculate L1-norm signal subspaces which are less sensitive to outlying data than L2-calculated subspaces. We start with the computation of the L1 maximum-projection principal component of a data matrix containing N signal samples of dimension D. We show that while the general problem is formally NP-hard in asymptotically large N, D, the case of engineer...
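A toy numerical check of that sensitivity claim (our own construction, not from the paper): rank-1 nominal data along a direction u, plus one moderate outlier orthogonal to it. The L1-PC (K = 1, exhaustive search) typically stays far closer to u than the L2 principal component does.

    import numpy as np
    from itertools import product

    rng = np.random.default_rng(2)
    u = np.array([1.0, 0.0, 0.0])
    X = np.outer(u, rng.standard_normal(8))
    X[:, -1] = np.array([0.0, 8.0, 0.0])      # one outlier sample

    w_l2 = np.linalg.svd(X)[0][:, 0]          # L2 PC: top singular vector
    # L1 PC (K = 1): exhaustive max of ||X b||_2 over b in {-1, +1}^N
    b = max(product((-1.0, 1.0), repeat=X.shape[1]),
            key=lambda s: np.linalg.norm(X @ np.array(s)))
    w_l1 = X @ np.array(b)
    w_l1 /= np.linalg.norm(w_l1)

    print(abs(w_l2 @ u), abs(w_l1 @ u))       # alignment with true direction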



Publication year: 2017